Embarrassingly Parallel Independent Training of Multi-Layer Perceptrons with Heterogeneous Architectures

Authors

Abstract

In this paper we propose a procedure to enable the training of several independent Multilayer Perceptron Neural Networks with different numbers of neurons and activation functions in parallel (ParallelMLPs), by exploiting the principle of locality and the parallelization capabilities of modern CPUs and GPUs. The core idea of the technique is to represent the sub-networks as a single large network and to use a Modified Matrix Multiplication that replaces an ordinary matrix multiplication with two simple operations that allow separate and independent paths for gradient flow. We have assessed our algorithm on simulated datasets with varying numbers of samples, features and batches, using 10,000 models, as well as on the MNIST dataset. We achieved a training speedup of 1 to 4 orders of magnitude compared to the sequential approach. The code is available online.
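A minimal sketch may help make the abstract concrete: several sub-MLPs of different widths are stored as one large network, and the hidden-to-output matrix multiplication is replaced by an element-wise product followed by a per-model reduction, so that gradients never cross sub-networks. This is not the authors' released code; PyTorch, the tensor names, and the mask-based reduction are illustrative assumptions.

```python
# Illustrative sketch only (assumed PyTorch implementation, not the paper's code).
import torch

batch, in_features, out_features = 32, 10, 1
hidden_sizes = [4, 8, 16]                  # three sub-MLPs with different widths
n_models, total_hidden = len(hidden_sizes), sum(hidden_sizes)

# Map each concatenated hidden neuron to the sub-MLP that owns it.
model_of_neuron = torch.repeat_interleave(
    torch.arange(n_models), torch.tensor(hidden_sizes)
)                                          # shape: [total_hidden]

# One large network: every column of W_in / row of W_out belongs to exactly one model.
W_in = torch.randn(in_features, total_hidden, requires_grad=True)
W_out = torch.randn(total_hidden, out_features, requires_grad=True)

x = torch.randn(batch, in_features)
h = torch.tanh(x @ W_in)                   # shared input layer, [batch, total_hidden]
# (tanh stands in for whatever per-model activation each sub-MLP would actually use)

# "Modified" multiplication: element-wise product, then a sum restricted to each
# sub-network instead of a single matmul over the whole hidden dimension.
prod = h.unsqueeze(-1) * W_out             # [batch, total_hidden, out_features]
mask = (torch.arange(n_models).unsqueeze(1) == model_of_neuron.unsqueeze(0)).float()
outputs = torch.einsum("mh,bho->bmo", mask, prod)   # [batch, n_models, out_features]

# Summing per-model losses is safe: each term only touches its own slice of the
# parameters, so one backward pass trains all sub-MLPs independently.
targets = torch.randn(batch, out_features)
loss = ((outputs - targets.unsqueeze(1)) ** 2).mean(dim=(0, 2)).sum()
loss.backward()
```

Because each sub-network's output depends only on its own slice of W_in and W_out, a single backward pass still yields fully independent updates for every model, which is what makes the training embarrassingly parallel on a single device.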


Similar Articles

Training Multi-layer Perceptrons Using MiniMin Approach

Multi-layer perceptrons (MLPs) have been widely used in classification and regression tasks. How to improve the training speed of MLPs has been an interesting field of research. Instead of the classical method, we try to train MLPs by a MiniMin model which can ensure that the weights of the last layer are optimal at each step. Significant improvement in training speed has been made using our met...

Understanding Dropout: Training Multi-Layer Perceptrons with Auxiliary Independent Stochastic Neurons

In this paper, a simple, general method of adding auxiliary stochastic neurons to a multi-layer perceptron is proposed. It is shown that the proposed method is a generalization of recently successful methods of dropout [5], explicit noise injection [12,3] and semantic hashing [10]. Under the proposed framework, an extension of dropout which allows using separate dropping probabilities for diffe...

Handwritten Digit Recognition based on Output-Independent Multi-Layer Perceptrons

Handwritten digit recognition is an established and significant problem in computer vision and pattern recognition, and a great deal of research has been undertaken in this area. It is not a trivial task because of the large variation in writing styles found in the available data. Therefore, both the features and the classifi...

Bayesian Nonlinear Independent Component Analysis by Multi-Layer Perceptrons

In this chapter, a nonlinear extension to independent component analysis is developed. The nonlinear mapping from source signals to observations is modelled by a multi-layer perceptron network and the distributions of source signals are modelled by mixture-of-Gaussians. The observations are assumed to be corrupted by Gaussian noise and therefore the method is more adequately described as nonlin...

Journal

Journal title: AI

Year: 2022

ISSN: 2673-2688

DOI: https://doi.org/10.3390/ai4010002